
    The Foundations of Banks' Risk Regulation: a Review of the Literature

    The stability of the banking industry around the world has been observed to be periodic since the Great Depression. Financial markets have changed dramatically over the last twenty-five years, introducing more competition for and from banks. Banks are the financial institutions responsible for providing liquidity to the economy. This responsibility is, however, the main cause of their fragility. Deposit insurance is the most efficient instrument for protecting depositors and preventing bank runs. Pricing deposit insurance according to each bank's individual risk seems to be the most appropriate strategy, but it does not appear to be sufficient: residual information problems seem to remain in the market, although no appropriate statistical analysis of this issue exists. In 1988, the G10 modified banking regulation significantly by setting capital standards for international banks. These standards have since been adopted by more than one hundred countries as part of their national regulation of banks' risk. Current regulation of bank capital adequacy has its critics because it imposes the same rules on all banks. This seems particularly unsuitable when applied to credit risk, which is the major source of a bank's risk (about 70%). Moreover, the diversification of a bank's credit-risk portfolio is not taken into account in the computation of capital ratios. These shortcomings seem to have distorted the behaviour of banks, which makes monitoring them much more complicated. In fact, it is not even clear that the higher capital ratios observed since the introduction of this new form of capital regulation necessarily lower risks. Additional reform is expected in 2004, but there is as yet no consensus on the form it will take, nor on whether it will suitably regulate banks in individual countries.
    Consequently, it might be appropriate to continue developing national regulation based on optimal deposit insurance (with individual insurance pricing and continuous auditing of individual risk) and to keep searching for other optimal complementary instruments against systemic risk, instruments suitably designed to fit the banking industry's peculiar structure. Other market-discipline instruments (such as subordinated debt) and governance instruments may be more efficient than the current capital-requirement scheme for addressing the banks' commitment problem associated with deposit insurance. The central bank should be responsible for aggregate liquidity. Confidence in the financial sector is a public good that must be ensured by the government. Who should be in charge: the central bank or a regulatory agency? The reviewed literature suggests that this role should be taken by a regulatory agency independent of both the central bank and the political power.
    Keywords: bank, liquidity, deposit insurance, capital standard, national regulation, credit risk, capital regulation, subordinated debt, governance, capital requirement, central bank, regulatory agency

    Structured Finance, Risk Management, and the Recent Financial Crisis

    Structured finance is often mentioned as the main cause of the latest financial crisis. We argue that structured finance per se did not trigger the crisis. The crisis was propagated around the world because of poor risk management: agency problems in the securitization market, poor rating and pricing standards, rating-agency incentives, lack of market transparency, the search for higher yields by top decision makers, and the failure of regulators and central banks to understand the implications of the changing environment.
    Keywords: structured finance, risk management, financial crisis, collateralized debt obligation (CDO), asset-backed commercial paper (ABCP), rating, pricing, securitization, regulation of financial markets

    New Evidence on the Determinants of Absenteeism Using Linked Employer-Employee Data

    In this paper, we provide new evidence on the determinants of absenteeism using the Workplace Employee Survey (WES) 1999-2002 from Statistics Canada. Our paper extends the typical labour-leisure model used to analyze the decision to skip work to include firm-level policy variables relevant to the absenteeism decision and uncertainty about the cost of absenteeism. It also provides a non-linear econometric model that explicitly takes into account the count nature of absenteeism data and unobserved heterogeneity at both the individual and firm levels. Controlling for very detailed demographic, job, and firm characteristics (including workplace practices), we find that dissatisfaction with contracted hours is a significant determinant of absence.
    Keywords: absenteeism; linked employer-employee data; unobserved heterogeneity; count data models
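The count-data approach mentioned in this abstract can be illustrated with a minimal sketch (hypothetical synthetic data, not the WES sample): a plain Poisson regression fitted by Newton-Raphson, which is the building block that count models with unobserved heterogeneity extend.

```python
import numpy as np

def fit_poisson(X, y, n_iter=25):
    """Poisson regression log E[y|x] = X @ beta, fitted by Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)           # conditional mean of the count
        grad = X.T @ (y - mu)           # score vector
        hess = X.T @ (X * mu[:, None])  # expected information matrix
        beta = beta + np.linalg.solve(hess, grad)
    return beta

# Hypothetical example: absence counts rising with a dissatisfaction dummy.
rng = np.random.default_rng(0)
n = 5000
dissatisfied = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), dissatisfied])
y = rng.poisson(np.exp(0.3 + 0.5 * dissatisfied))
beta_hat = fit_poisson(X, y)
print(beta_hat)  # close to the true coefficients [0.3, 0.5]
```

Unobserved heterogeneity is typically added by mixing an individual- or firm-level random effect into the mean, which this sketch omits for brevity.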

    On the Necessity of Using Lottery Qualities

    The aim of this paper is to propose a model of decision-making for lotteries. The key element of the theory is the use of lottery qualities. Qualities allow the derivation of optimal decision-making processes and are taken explicitly into account for lottery evaluation. Our contribution explains the major violations of expected utility theory for decisions on two-point lotteries and shows the necessity of giving explicit consideration to lottery qualities.
    Keywords: lottery choice, common ratio, preference reversal, pricing, lottery test, cognitive process, certainty equivalent, lottery quality

    Estimating the effect of a change in insurance pricing regime on accidents with endogenous mobility.

    In this paper, we estimate the impact of introducing a bonus-malus system on the probability of having automobile accidents, taking into account contract duration and client mobility between insurers. We show that the new incentive scheme reduces the accident rates of all policyholders when contract duration is taken into account, but does not affect the accident rates of movers, who escape the incentive effects imposed by the new insurance pricing scheme.
    Keywords: bonus-malus; contract duration; automobile accident; Poisson distribution; right- and left-censoring; exponential distribution
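The duration side of this setup can be sketched with a toy example (hypothetical parameters, not the paper's data): with exponentially distributed times to first accident and right-censoring at the end of the contract, the maximum-likelihood estimate of the accident rate is the number of observed events divided by the total time at risk.

```python
import numpy as np

# Hypothetical sketch of right-censored exponential durations.
rng = np.random.default_rng(1)
true_rate = 0.8
n = 20000
t_event = rng.exponential(1 / true_rate, n)  # latent time to first accident
t_censor = np.full(n, 1.0)                   # one-year contracts censor observation
observed = np.minimum(t_event, t_censor)     # observed duration
event = t_event <= t_censor                  # True if the accident is observed

# Censored-likelihood MLE: events / total exposure.
rate_hat = event.sum() / observed.sum()
print(rate_hat)  # close to the true rate 0.8
```

Ignoring the censoring (e.g., treating censored spells as events) would bias the estimated rate upward, which is why the censoring structure matters here.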

    A Theoretical Extension of the Consumption-based CAPM Model

    We extend the Consumption-based CAPM (C-CAPM) model for representative agents with different risk attitudes. We introduce the concept of expectation dependence and show that, for a risk-averse representative agent, it is first-degree expectation dependence rather than covariance that determines C-CAPM's riskiness. We extend the assumption of risk aversion to prudence and provide a weaker dependence condition than first-degree expectation dependence to obtain the values of asset price and equity premium. Results are generalized to higher-degree risk changes and higher-order representative agents, and are linked to the equity premium puzzle.
    Keywords: Consumption-based CAPM, risk premium, equity premium puzzle
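A rough sketch of how expectation dependence enters the standard C-CAPM pricing relation (standard textbook notation; this is a hedged reconstruction, not the paper's exact setup):

```latex
% Pricing kernel m = \beta u'(\tilde c)/u'(c_0), payoff \tilde x, price p:
p = E[m\,\tilde x], \qquad
E[\tilde R] - R_f = -R_f\,\mathrm{Cov}(m, \tilde R).

% First-degree expectation dependence (FED) of the payoff on consumption:
\mathrm{FED}(\tilde x \mid c) = E[\tilde x] - E[\tilde x \mid \tilde c \le c].

% A covariance identity links the two, with F the c.d.f. of \tilde c:
\mathrm{Cov}\bigl(u'(\tilde c), \tilde x\bigr)
  = \int u''(c)\,\mathrm{FED}(\tilde x \mid c)\,F(c)\,dc.
```

For a risk-averse agent (u'' < 0), positive first-degree expectation dependence everywhere yields a negative covariance between the payoff and marginal utility, hence a positive risk premium, which is the sense in which dependence rather than raw covariance drives riskiness.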

    Correlated Poisson Processes with Unobserved Heterogeneity: Estimating the Determinants of Paid and Unpaid Leave

    Using linked employer-employee data from the Canadian Workplace and Employee Survey 1999-2004, we provide new evidence on how the cost of absence affects labor supply decisions. We use a particular feature of the data by which total absences are divided into three separate categories: sick paid days, other paid days, and unpaid days. This division introduces variation in the way workers are compensated for absence (the cost of absence) and allows us to estimate more precisely how variations in such costs affect absenteeism decisions. We find an absence elasticity of -0.37. We also find that unobserved heterogeneity plays different roles for workers and workplaces: some workers are more frequently absent whatever the reason, but paid and unpaid leaves are negatively correlated at the workplace level.
    Keywords: absenteeism; linked employer-employee data; unobserved heterogeneity; count data models; correlated random effects
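The correlated-random-effects mechanism in this abstract can be simulated in a few lines (all parameters hypothetical): two leave counts whose log-means share correlated workplace effects, where a negative correlation between the effects induces a negative correlation between paid and unpaid leave counts.

```python
import numpy as np

# Hypothetical sketch: paid and unpaid leave counts driven by correlated
# workplace-level random effects with an assumed negative correlation.
rng = np.random.default_rng(2)
n_workplaces = 40000
rho = -0.6                                   # assumed workplace-level correlation
var = 0.25                                   # assumed random-effect variance
cov = np.array([[var, rho * var], [rho * var, var]])
u = rng.multivariate_normal([0.0, 0.0], cov, n_workplaces)

paid = rng.poisson(np.exp(0.5 + u[:, 0]))    # paid-leave counts
unpaid = rng.poisson(np.exp(0.2 + u[:, 1]))  # unpaid-leave counts
corr = np.corrcoef(paid, unpaid)[0, 1]
print(corr)  # negative, reflecting the correlated random effects
```

Estimation of such a model works in the opposite direction: the sign and magnitude of the latent correlation are recovered from the observed joint distribution of the two counts.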

    Corporate Risk Management and Dividend Signaling Theory

    This paper investigates the effect of corporate risk management on dividend policy. We extend the signaling framework of Bhattacharya (1979) by including the possibility of hedging the future cash flow. We find that the higher the hedging level, the lower the incremental dividend. This result is in line with the purported positive relation between information asymmetry and dividend policy (e.g., Miller and Rock, 1985) and the assertion that risk management alleviates the information asymmetry problem (e.g., DaDalt et al., 2002). Our theoretical model has testable implications.
    Keywords: signaling theory, dividend policy, risk management policy, corporate hedging, information asymmetry

    On Risk Management Determinants: What Really Matters?

    We investigate the determinants of the risk management decision for an original dataset of North American gold mining firms. We propose explanations based on the firm's financial characteristics, managerial risk aversion, and internal corporate governance mechanisms. We develop a theoretical model in which the debt and hedging decisions are made simultaneously. Our model suggests that more hedging does not always lead to a higher debt capacity when the firm holds a standard debt contract, while hedging is an increasing function of the firm's financial distress costs. We then test the predictions of our model. To estimate our system of simultaneous Tobit equations, we extend, to panel data, the minimum distance estimator proposed by Lee (1995). We find that financial distress costs, information asymmetry, separation of the CEO and chairman of the board positions, and managerial risk aversion are important determinants of the decision to hedge, whereas the composition of the board of directors has no impact on this decision. Also, our results do not support the conclusion that firms hedge in order to increase their debt capacity, which seems to confirm our model's prediction.
    Keywords: risk management determinants, corporate hedging, capital structure, managerial risk aversion, gold price, tax incentive, minimum distance estimator, panel data, Tobit, corporate governance

    Scaling Models for the Severity and Frequency of External Operational Loss Data

    According to Basel II criteria, the use of external data is absolutely indispensable to the implementation of an advanced method for calculating operational capital. This article investigates how the severities and frequencies of external losses are scaled for integration with internal data. We set up an initial model designed to explain loss severity. This model takes into account firm size, location, and business lines, as well as risk types. It also shows how to calculate the internal loss equivalent to an external loss that might occur in a given bank. OLS estimation results show that the above variables have significant power in explaining the loss amount, and they are used to develop a normalization formula. A second model based on external data is developed to scale the frequency of losses over a given period. Two regression models are analyzed: the truncated Poisson model and the truncated negative binomial model. Variables measuring the size and geographical distribution of the banks' activities are introduced as explanatory variables. The results show that the negative binomial distribution outperforms the Poisson distribution. The scaling is done by calculating the parameters of the selected distribution from the estimated coefficients and the variables related to a given bank. Frequencies of losses of more than $1 million are then generated over a specific horizon.
    Keywords: operational risk in banks, scaling, severity distribution, frequency distribution, truncated count data regression models
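The truncation issue described here arises because external databases record only losses above a reporting threshold. A minimal sketch (hypothetical data, zero-truncation rather than the paper's full specification): the MLE of a zero-truncated Poisson solves the moment condition mean(y) = lambda / (1 - exp(-lambda)), which a few Newton steps handle.

```python
import numpy as np

def fit_truncated_poisson(y, n_iter=50):
    """MLE of a zero-truncated Poisson: solve lam/(1-exp(-lam)) = mean(y)."""
    ybar = y.mean()
    lam = ybar  # starting value
    for _ in range(n_iter):
        e = np.exp(-lam)
        f = lam / (1 - e) - ybar
        df = (1 - e - lam * e) / (1 - e) ** 2  # derivative of lam/(1-exp(-lam))
        lam -= f / df
    return lam

# Hypothetical example: draw Poisson counts, keep only the positive ones,
# mimicking a database that records only periods with at least one large loss.
rng = np.random.default_rng(3)
counts = rng.poisson(2.0, 50000)
truncated = counts[counts > 0]
lam_hat = fit_truncated_poisson(truncated)
print(lam_hat)  # close to the untruncated mean 2.0
```

Fitting a plain Poisson to the truncated sample would instead return the inflated truncated mean, which is why the truncation correction matters before scaling frequencies to a given bank.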